Soft Margin Training for Associative Memories Implemented by Recurrent Neural Networks

Authors

  • José Antonio Ruz Hernández
  • Edgar N. Sánchez
  • Dionisio A. Suarez
Abstract

In this paper, the authors present a new synthesis approach for training associative memories based on recurrent neural networks (RNNs). They propose soft margin training for associative memories, which is effective when the training patterns are not all linearly separable. Building on the soft margin algorithm used to train support vector machines (SVMs), the new algorithm is developed to improve on the results obtained with the optimal training algorithm previously introduced by the authors, which are not fully satisfactory because the training patterns are sometimes not all linearly separable. The new algorithm is used to synthesize an associative memory designed as an RNN whose connection matrix has upper bounds on its diagonal elements, which reduces the total number of spurious memories. The scheme is evaluated on a full-scale simulator for diagnosing the main faults that occur in fossil-fuel electric power plants.
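To make the construction concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of row-wise soft-margin synthesis for a bipolar associative memory: each row of the connection matrix is trained as a soft-margin linear classifier that must reproduce the corresponding component of every stored pattern, and the diagonal entry is capped in line with the upper bound mentioned in the abstract. All names (train_soft_margin_memory, diag_bound) and the use of scikit-learn's LinearSVC as the soft-margin solver are illustrative assumptions.

import numpy as np
from sklearn.svm import LinearSVC

def train_soft_margin_memory(patterns, C=1.0, diag_bound=0.5):
    # patterns: (m, n) array of bipolar (+1/-1) memories to store.
    m, n = patterns.shape
    W = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        y = patterns[:, i]              # target: neuron i's bit on each memory
        if np.unique(y).size == 1:
            b[i] = y[0]                 # constant bit: a bias alone suffices
            continue
        clf = LinearSVC(C=C)            # soft-margin hinge loss (assumption)
        clf.fit(patterns, y)
        W[i] = clf.coef_[0]
        b[i] = clf.intercept_[0]
        # Cap the self-connection, mirroring the bound on the diagonal
        # elements that reduces spurious memories.
        W[i, i] = min(W[i, i], diag_bound)
    return W, b

def recall(W, b, x, steps=50):
    # Iterate the recurrent map until a fixed point is reached.
    for _ in range(steps):
        x_new = np.where(W @ x + b >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

Stored patterns are fixed points whenever each row's classifier separates its targets; the soft margin tolerates pattern sets that are not fully linearly separable at the cost of a few violated constraints.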


Similar articles

A neural network with a single recurrent unit for associative memories based on linear optimization

Recently, some continuous-time recurrent neural networks have been proposed for associative memories based on optimizing linear or quadratic programming problems. In this paper, a simple and efficient neural network with a single recurrent unit is proposed for realizing associative memories. Compared with the existing neural networks for associative memories, the main advantage of the proposed ...


Continuous Attractors in Recurrent Neural Networks and Phase Space Learning

Recurrent networks can be used as associative memories in which the stored memories are fixed points to which the network dynamics converge. These networks, however, can also exhibit continuous attractors, such as limit cycles and chaotic attractors. The case for using these attractors in recurrent networks to construct associative memories is argued. Here, we provide a training algori...
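For a concrete picture of stored memories as fixed points (a standard Hopfield construction, not code from this paper): Hebbian outer-product weights let a corrupted cue relax back to the stored pattern.

import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 3
patterns = rng.choice([-1, 1], size=(m, n))   # memories to store

W = (patterns.T @ patterns) / n               # Hebbian outer-product weights
np.fill_diagonal(W, 0)                        # no self-connections

x = patterns[0].copy()                        # corrupt 8 bits of memory 0
x[rng.choice(n, size=8, replace=False)] *= -1

for _ in range(20):                           # synchronous sign updates
    x_new = np.where(W @ x >= 0, 1, -1)
    if np.array_equal(x_new, x):
        break                                 # fixed point reached
    x = x_new

print(np.array_equal(x, patterns[0]))         # typically True: cue recovered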


Neural Network Applications F1.4 Associative memory

This section considers how neural networks can be used as associative memory devices. It first describes what an associative memory is, and then moves on to describe associative memories based on feedforward neural networks and associative memories based on recurrent networks. The section also describes associative memory systems based on cognitive models. It also highlights the ability of neur...


Face Recognition Using Recurrent High-Order Associative Memories

A novel face recognition approach is proposed, based on the use of compressed discriminative features and recurrent neural classifiers. Low-dimensional feature vectors are extracted through a combined effect of wavelet decomposition and subspace projections. The classifier is implemented as a special gradient-type recurrent analog neural network acting as an associative memory. The system exhib...


Fast Weight Long Short-term Memory

Associative memory using fast weights is a short-term memory mechanism that substantially improves the memory capacity and time scale of recurrent neural networks (RNNs). As recent studies introduced fast weights only to regular RNNs, it is unknown whether fast weight memory is beneficial to gated RNNs. In this work, we report a significant synergy between long short-term memory (LSTM) networks...
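As a hedged sketch of the fast-weight mechanism referred to above (following the decay-plus-outer-product rule popularized by Ba et al. for regular RNNs; this paper's gated variant may differ): a fast weight matrix A decays at each step and imprints the current hidden state, acting as a short-term associative memory alongside the slow weights.

import numpy as np

def fast_weight_step(x, h_prev, A, W_x, W_h, lam=0.9, eta=0.5):
    # Slow-weight preactivation plus a fast-weight associative readout.
    h = np.tanh(W_x @ x + W_h @ h_prev + A @ h_prev)
    # Decay old associations, imprint the new hidden state.
    A = lam * A + eta * np.outer(h, h)
    return h, A

In an LSTM, the same A @ h_prev term would be added to the gate preactivations; the vanilla-RNN form is shown here for brevity.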



Journal title:

Volume   Issue

Pages  -

Publication date: 2007